# Multi-script support
## Nllb
The distilled, 600M-parameter version of NLLB-200, supporting machine translation across 200 languages.
Task: Machine Translation · Tags: Transformers, Multilingual

Author: Narsil · Downloads: 113 · Likes: 2
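For reference, a minimal usage sketch: the snippet below loads the distilled NLLB-200 checkpoint through the Transformers translation pipeline. It assumes the canonical id "facebook/nllb-200-distilled-600M" and FLORES-200 language codes; the card above may mirror the same weights under a different repository id.

```python
from transformers import pipeline

# Minimal sketch, assuming the canonical NLLB-200 distilled checkpoint;
# the card above may host the same weights under another id.
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",  # FLORES-200 code: English, Latin script
    tgt_lang="srp_Cyrl",  # FLORES-200 code: Serbian, Cyrillic script
)

print(translator("Hello, world!")[0]["translation_text"])
```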
## Roberta Base Serbian
A Serbian RoBERTa model (Cyrillic and Latin scripts) pretrained on srWaC, intended for fine-tuning on downstream tasks.
Task: Large Language Model · Tags: Transformers, Other

Author: KoichiYasuoka · Downloads: 20 · Likes: 1
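Since the card targets downstream fine-tuning, the quickest sanity check of the pretrained weights is masked-token prediction. A minimal sketch, assuming the repository id "KoichiYasuoka/roberta-base-serbian" (inferred from the card name, not verified here):

```python
from transformers import pipeline

# Minimal sketch; the model id is inferred from the card name.
unmasker = pipeline("fill-mask", model="KoichiYasuoka/roberta-base-serbian")

# Serbian Cyrillic: "Belgrade is the capital of <mask>."
for prediction in unmasker("Београд је главни град <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```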
## Opus Tatoeba En Ja
License: Apache-2.0
An English-to-Japanese translation model based on the transformer-align architecture, trained with normalization and SentencePiece pre-processing, released by the Helsinki-NLP team.
Task: Machine Translation · Tags: Transformers, Multilingual

Author: Helsinki-NLP · Downloads: 1,851 · Likes: 13
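OPUS-MT checkpoints load through the Marian classes in Transformers, which apply the SentencePiece preprocessing internally. A minimal sketch, assuming the repository id "Helsinki-NLP/opus-tatoeba-en-ja" (inferred from the card name):

```python
from transformers import MarianMTModel, MarianTokenizer

# Minimal sketch; the model id is inferred from the card name.
model_id = "Helsinki-NLP/opus-tatoeba-en-ja"
tokenizer = MarianTokenizer.from_pretrained(model_id)
model = MarianMTModel.from_pretrained(model_id)

batch = tokenizer(["How are you today?"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```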
## Opus Mt Ja Hu
License: Apache-2.0
A Japanese-to-Hungarian machine translation model based on the transformer-align architecture, released as part of the Tatoeba-Challenge project.
Task: Machine Translation · Tags: Transformers, Multilingual

Author: Helsinki-NLP · Downloads: 33 · Likes: 2
## Opus Mt Ja Tr
License: Apache-2.0
A Japanese-to-Turkish machine translation model based on the transformer-align architecture, developed by the Helsinki-NLP team.
Task: Machine Translation · Tags: Transformers, Multilingual

Author: Helsinki-NLP · Downloads: 136 · Likes: 0
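The two Japanese-source cards above follow the same loading pattern; the generic translation pipeline is an even shorter route. A minimal sketch, assuming the id "Helsinki-NLP/opus-mt-ja-hu" (swap in "Helsinki-NLP/opus-mt-ja-tr" for Japanese-to-Turkish):

```python
from transformers import pipeline

# Minimal sketch; model ids are inferred from the card names.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-ja-hu")

# Japanese: "The weather is nice today."
print(translator("今日はいい天気です。")[0]["translation_text"])
```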